Hierarchical Tensor Decomposition of Latent Tree Graphical Models

Authors

  • Le Song
  • Mariya Ishteva
  • Ankur P. Parikh
  • Eric P. Xing
  • Haesun Park
Abstract

We approach the problem of estimating the parameters of a latent tree graphical model from a hierarchical tensor decomposition point of view. In this new view, the marginal probability table of the observed variables is treated as a tensor, and we show that: (i) the latent variables induce low rank structures in various matricizations of the tensor; (ii) this collection of low rank matricizations induces a hierarchical low rank decomposition of the tensor. We further derive an optimization problem for estimating (alternative) parameters of a latent tree graphical model, allowing us to represent the marginal probability table of the observed variables in a compact and robust way. The optimization problem aims to find the best hierarchical low rank approximation of a tensor in Frobenius norm. For correctly specified latent tree graphical models, we show that a global optimum of the optimization problem can be obtained via a recursive decomposition algorithm. This algorithm recovers previous spectral algorithms for hidden Markov models (Hsu et al., 2009; Foster et al., 2012) and latent tree graphical models (Parikh et al., 2011; Song et al., 2011) as special cases, elucidating the global objective these algorithms are optimizing. For misspecified latent tree graphical models, we derive a novel decomposition based on our framework, and provide an approximation guarantee and a computational complexity analysis. On both synthetic and real-world data, this new estimator significantly improves over the state of the art.
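The low rank observation at the heart of the abstract can be checked with a few lines of numpy. The sketch below is not the paper's algorithm, only a toy illustration under assumed dimensions: a single hidden variable with k states generates three observed variables, the joint table is matricized by splitting the observed variables into two groups, and a truncated SVD yields the rank-k factors that a hierarchical low rank decomposition would store for that split.

```python
import numpy as np

# Toy check (assumed dimensions, not the paper's algorithm): one hidden variable H
# with k states generates three observed variables X1, X2, X3 with d states each.
rng = np.random.default_rng(0)
k, d = 3, 5
p_h = rng.dirichlet(np.ones(k))                               # P(H)
cond = [rng.dirichlet(np.ones(d), size=k) for _ in range(3)]  # P(Xi | H), each of shape (k, d)

# Joint table P(X1, X2, X3) = sum_h P(H=h) * prod_i P(Xi=xi | H=h), a d x d x d tensor.
joint = np.einsum('h,ha,hb,hc->abc', p_h, *cond)

# Matricize by grouping {X1} against {X2, X3}: the hidden variable bounds the rank by k.
mat = joint.reshape(d, d * d)
print('numerical rank:', np.linalg.matrix_rank(mat, tol=1e-10))   # <= k

# A truncated SVD gives the best rank-k approximation in Frobenius norm, i.e. the
# factors a hierarchical low rank decomposition would keep for this split.
U, s, Vt = np.linalg.svd(mat, full_matrices=False)
approx = (U[:, :k] * s[:k]) @ Vt[:k, :]
print('rank-k Frobenius error:', np.linalg.norm(mat - approx))    # ~0
```

Roughly speaking, a recursive decomposition applies a step like this for each edge of the latent tree, with each matricization grouping the observed variables on the two sides of that edge.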


Related works

Approximate Inference in Graphical Models using Tensor Decompositions

We demonstrate that tensor decompositions can be used to transform graphical models into structurally simpler graphical models that approximate the same joint probability distribution. Standard inference algorithms, such as the junction tree algorithm, can then be applied to the transformed graphical model for approximate inference. The usefulness of the technique is demonstrat...
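As a rough illustration of what such a transformation can look like (my own sketch, not this paper's method, with arbitrary dimensions and factor values): a three-way potential of CP rank R can be replaced by three pairwise potentials attached to an auxiliary variable with R states, and inference queries can then be answered on the structurally simpler star-shaped model.

```python
import numpy as np

# Sketch (own illustration, arbitrary sizes): a 3-way potential of CP rank R,
# psi(a, b, c) = sum_r fa[r, a] * fb[r, b] * fc[r, c].
rng = np.random.default_rng(1)
dA, dB, dC, R = 4, 5, 6, 2
fa, fb, fc = rng.random((R, dA)), rng.random((R, dB)), rng.random((R, dC))
psi = np.einsum('ra,rb,rc->abc', fa, fb, fc)

# Transformed model: an auxiliary variable Z with R states and pairwise factors
# psi_A(z, a) = fa[z, a], psi_B(z, b) = fb[z, b], psi_C(z, c) = fc[z, c].
# Summing out Z reproduces psi exactly, but the largest factor is now only pairwise.

# Example query: the unnormalized marginal over A, computed both ways.
marg_a_dense = psi.sum(axis=(1, 2))              # uses the full dA*dB*dC table
msg_b = fb.sum(axis=1)                           # sum_b psi_B(z, b)
msg_c = fc.sum(axis=1)                           # sum_c psi_C(z, c)
marg_a_star = (fa * (msg_b * msg_c)[:, None]).sum(axis=0)
print(np.allclose(marg_a_dense, marg_a_star))    # True
```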


Discovery of Latent Factors in High-dimensional Data Using Tensor Methods

Abstract of the dissertation "Discovery of Latent Factors in High-dimensional Data Using Tensor Methods" by Furong Huang, Doctor of Philosophy in Electrical and Computer Engineering, University of California, Irvine, 2016; Assistant Professor Animashree Anandkumar, Chair. Unsupervised learning aims at the discovery of hidden structure that drives the observations in the real world. It is ...


Kernel Embeddings of Latent Tree Graphical Models

Latent tree graphical models are natural tools for expressing long-range and hierarchical dependencies among many variables, which are common in computer vision, bioinformatics and natural language processing problems. However, existing models are largely restricted to discrete and Gaussian variables due to computational constraints; furthermore, algorithms for estimating the latent tree structure...


Nonparametric Latent Tree Graphical Models: Inference, Estimation, and Structure Learning

Tree-structured graphical models are powerful at expressing long-range or hierarchical dependencies among many variables, and have been widely applied in different areas of computer science and statistics. However, existing methods for parameter estimation, inference, and structure learning mainly rely on Gaussian or discrete assumptions, which are restrictive in many applications. In this...


Tensor Belief Propagation

We propose a new approximate inference algorithm for graphical models, tensor belief propagation, based on approximating the messages passed in the junction tree algorithm. Our algorithm represents the potential functions of the graphical model and all messages on the junction tree compactly as mixtures of rank-1 tensors. Using this representation, we show how to perform the operations required...
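A small numpy sketch of why this representation is convenient (my own illustration, not the authors' code): when a potential is stored as a mixture of rank-1 terms, multiplying in an incoming message and summing out a variable only reweights the mixture components, so the outgoing message never requires materializing the dense table.

```python
import numpy as np

# Sketch (own illustration): a potential over (X, Y) stored as a mixture of
# rank-1 terms, psi(x, y) = sum_r w[r] * u[r, x] * v[r, y].
rng = np.random.default_rng(2)
dX, dY, R = 4, 3, 2
w = rng.random(R)
u, v = rng.random((R, dX)), rng.random((R, dY))
psi = np.einsum('r,rx,ry->xy', w, u, v)

# Incoming message over X (kept rank 1 here for simplicity).
m_x = rng.random(dX)

# Outgoing message over Y: multiply in m_x and sum out X.
# In the mixture form this only reweights the components.
w_out = w * (u @ m_x)                 # new mixture weights
msg_y_mixture = w_out @ v             # sum_r w_out[r] * v[r, :]

# Check against the dense computation.
msg_y_dense = (psi * m_x[:, None]).sum(axis=0)
print(np.allclose(msg_y_mixture, msg_y_dense))   # True
```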



Venue: Proceedings of the 30th International Conference on Machine Learning (ICML 2013), Atlanta, Georgia, USA; JMLR: W&CP volume 28

Publication year: 2013